Sliced Regression for Dimension Reduction
Authors
Abstract
By slicing the range of the response (Li, 1991, SIR) and applying local kernel regression (Xia et al., 2002, MAVE) within each slice, a new dimension reduction method is proposed. Compared with traditional inverse regression methods, e.g. sliced inverse regression (Li, 1991), the new method is free of the linearity condition (Li, 1991) and enjoys much improved estimation accuracy. Compared with direct estimation methods (e.g., MAVE), the new method is much more robust against extreme values and can capture the entire central subspace (Cook, 1998b, CS) exhaustively. To determine the CS dimension, a consistent cross-validation (CV) criterion is developed. Extensive numerical studies, including a real-data example, confirm our theoretical findings.
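As a point of reference for the slicing idea that the proposed method builds on (it replaces the inverse slice means below with local kernel regression within each slice), the classic sliced inverse regression of Li (1991) can be sketched as follows. This is a minimal illustration, not the authors' estimator; the function name and defaults are ours.

```python
import numpy as np

def sir_directions(x, y, n_slices=5, d=1):
    """Classic SIR (Li, 1991): slice the response, average the
    standardized predictors within each slice, and take the leading
    eigenvectors of the between-slice covariance of those means."""
    n, p = x.shape
    # Standardize predictors: z = Sigma^{-1/2} (x - mean)
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    z = (x - mu) @ inv_sqrt
    # Slice the response into roughly equal-count slices
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted between-slice covariance of the slice means of z
    m = np.zeros((p, p))
    for idx in slices:
        zbar = z[idx].mean(axis=0)
        m += (len(idx) / n) * np.outer(zbar, zbar)
    # Leading eigenvectors, mapped back to the original x-scale
    _, v = np.linalg.eigh(m)
    beta = inv_sqrt @ v[:, ::-1][:, :d]
    return beta / np.linalg.norm(beta, axis=0)
```

Note that this inverse-moment estimator needs the linearity condition on the predictor distribution; the sliced regression method above dispenses with it by fitting a forward (MAVE-type) regression inside each slice.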
Similar resources
A note on shrinkage sliced inverse regression
We employ Lasso shrinkage within the context of sufficient dimension reduction to obtain a shrinkage sliced inverse regression estimator, which provides easier interpretations and better prediction accuracy without assuming a parametric model. The shrinkage sliced inverse regression approach can be employed for both single-index and multiple-index models. Simulation studies suggest that the new...
Sufficient dimension reduction in regressions across heterogeneous subpopulations
Sliced inverse regression is one of the widely used dimension reduction methods. Chiaromonte and co-workers extended this method to regressions with qualitative predictors and developed a method, partial sliced inverse regression, under the assumption that the covariance matrices of the continuous predictors are constant across the levels of the qualitative predictor. We extend partial sliced i...
Likelihood-based Sufficient Dimension Reduction
We obtain the maximum likelihood estimator of the central subspace under conditional normality of the predictors given the response. Analytically and in simulations we found that our new estimator can perform much better than sliced inverse regression, sliced average variance estimation and directional regression, and that it seems quite robust to deviations from normality.
Consistency of regularized sliced inverse regression for kernel models
We develop an extension of the sliced inverse regression (SIR) framework for dimension reduction using kernel models and Tikhonov regularization. The result is a numerically stable nonlinear dimension reduction method. We prove consistency of the method under weak conditions even when the reproducing kernel Hilbert space induced by the kernel is infinite dimensional. We illustrate the utility o...
Sliced Inverse Moment Regression Using Weighted Chi-Squared Tests for Dimension Reduction∗
We propose a new class of dimension reduction methods using the first two inverse moments, called Sliced Inverse Moment Regression (SIMR). We develop corresponding weighted chi-squared tests for the dimension of the regression. The SIMR estimators are linear combinations of Sliced Inverse Regression (SIR) and the method using a new candidate matrix which is designed to recover the entire inverse se...
Journal:
Volume, Issue:
Pages: -
Publication date: 2008